

Search for: All records

Creators/Authors contains: "Laefer, Debra"

Note: When clicking a Digital Object Identifier (DOI) link, you will be taken to an external site maintained by the publisher. Some full-text articles may not be available free of charge during the embargo period.

Some links on this page may take you to non-federal websites. Their policies may differ from this site.

  1. Introduction: Without community-based data-aggregation tools, timely and meaningful local input into brownfield management is not tenable, as relevant data are dispersed and often incomplete. In response, this project lays the groundwork through which constructive dialogue between community members and local officials can be facilitated. Materials and methods: A Brownfield Engagement Tool (BET) is envisioned as a means by which non-experts can use disparately held open data streams to collect, analyse, and visualise brownfield site data, better understand aggregate health risks, and provide direct input into remediation and redevelopment decisions. By raising awareness and providing knowledge about brownfield-related issues, the BET is intended to encourage community member participation in public debate. This concept is demonstrated for a 113-hectare Brooklyn, New York neighbourhood with a long history of industrial and mixed-use development resulting in 18 brownfields. The proposed remediation prioritization strategy offers a systematic analysis of the sites' size, contaminants, proximity to gathering spots, and demographics. Results: The BET proposed in this paper offers a novel approach for community-based management of brownfields at the census-tract level, based on the factors that most affect the local community. By combining publicly available municipal, state, and federal data in the BET, a set of easy-to-understand metrics can be generated through which a community can compare and rank existing brownfields to prioritize future interventions; these metrics can also support raising funding and investment to address neighbourhood issues. This type of approach is the first of its kind with respect to brownfield redevelopment.
    Free, publicly-accessible full text available December 31, 2026
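The abstract does not give the BET's actual scoring formula. As a rough illustration only, a weighted combination of the four factors it names (site size, contaminants, proximity to gathering spots, and demographics) might look like the sketch below; all weights, normalisation constants, and field names are hypothetical, not taken from the paper.

```python
from dataclasses import dataclass

@dataclass
class Brownfield:
    name: str
    area_ha: float               # site size in hectares
    contaminant_count: int       # number of listed contaminants
    dist_to_gathering_m: float   # distance to nearest gathering spot (school, park, ...)
    population_density: float    # residents per hectare in the surrounding tract

def priority_score(site, w_size=0.25, w_contam=0.35, w_prox=0.25, w_demo=0.15):
    """Combine normalized risk factors into one 0-1 score (higher = more urgent).
    Normalisation caps below are illustrative assumptions."""
    size = min(site.area_ha / 10.0, 1.0)              # saturate at 10 ha
    contam = min(site.contaminant_count / 10.0, 1.0)  # saturate at 10 contaminants
    prox = max(0.0, 1.0 - site.dist_to_gathering_m / 1000.0)  # closer = riskier
    demo = min(site.population_density / 500.0, 1.0)
    return w_size * size + w_contam * contam + w_prox * prox + w_demo * demo

sites = [
    Brownfield("Lot A", 2.5, 4, 150.0, 320.0),
    Brownfield("Lot B", 0.4, 1, 900.0, 80.0),
]
ranked = sorted(sites, key=priority_score, reverse=True)  # most urgent first
```

A community could then read the ranked list top-down when prioritizing remediation; the real BET additionally draws on municipal, state, and federal data sources.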
  2. Unlike aboveground utility systems, for which very detailed and accurate information exists, there is generally a dearth of good-quality data about underground utility infrastructures that provide vital services. To identify key strategies to improve the resilience of these underground systems, this paper presents mechanisms for successful engagement and collaboration among stakeholders and shared cross-sector system vulnerability concerns (including data availability) based on the innovative use of focus groups. Outputs from two virtual focus groups were used to obtain information from New York City area utilities and other stakeholders affected by underground infrastructure. There was strong agreement among participants that (1) a trusted agency in New York City government should manage a detailed map of underground infrastructure that would allow stakeholders to securely access appropriate information about underground systems on a need-to-know basis; (2) environmental risk factors, such as infrastructure age and condition, as well as location should be included; and (3) improved mechanisms for collaboration and sharing information are needed, especially during non-emergency situations. Stakeholders also highlighted the need for a regularly updated central database of relevant contacts at key organizations, since institutions often have a high employee turnover rate, which creates knowledge loss. The focus group script developed as part of this research was designed to be transferable to other cities to assess data needs and potential obstacles to stakeholder collaboration in the areas of underground infrastructure mapping and modeling.
    Free, publicly-accessible full text available March 1, 2026
  3. This paper proposes a flood risk visualization method that is (1) readily transferable, (2) hyperlocal, (3) computationally inexpensive, and (4) geometrically accurate. The proposal targets risk communication, providing high-resolution, three-dimensional flood visualization at the sub-meter level. The method couples a laser scanning point cloud with algorithms that produce textured floodwaters, achieved through compounding multiple sine functions in a graphics shader. This hyper-local approach to visualization is enhanced by the ability to portray changes in (i) water color, (ii) texture, and (iii) motion (including dynamic heights) for various flood prediction scenarios. By decoupling physics-based predictions from the visualization, a dynamic flood risk viewer was produced with modest processing resources: a single quad-core processor with a frequency around 4.30 GHz and no graphics card. The system offers several major advantages. (1) The approach enables its use on a browser or with inexpensive virtual reality hardware and thus promotes local dissemination for flood risk communication, planning, and mitigation. (2) The approach can be used for any scenario where water interfaces with the built environment, including inside pipes. (3) When tested for a coastal inundation scenario from a hurricane, 92% of the neighborhood participants found it more effective in communicating flood risk than the traditional 2D flood-map warnings provided by governmental authorities.
    Free, publicly-accessible full text available February 1, 2026
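The "compounding multiple sine functions" idea above can be sketched outside a shader as a plain height function evaluated per point and per frame; the wave parameters below are invented for illustration and are not taken from the paper.

```python
import math

# Each wave is (amplitude, spatial frequency, speed, phase); values are
# illustrative assumptions, not the paper's shader constants.
WAVES = ((0.12, 0.9, 1.1, 0.0),
         (0.05, 2.3, 1.7, 1.3),
         (0.02, 5.1, 2.9, 2.6))

def water_height(x, z, t, waves=WAVES):
    """Sum several sine waves to fake a textured, moving water surface,
    much as a vertex/fragment shader would evaluate it per sample."""
    h = 0.0
    for amp, freq, speed, phase in waves:
        h += amp * math.sin(freq * (x + 0.6 * z) + speed * t + phase)
    return h
```

Because the surface is a cheap analytic function of position and time, it can be re-evaluated every frame without a physics solver, which is what lets a viewer like this run without a graphics card.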
  4. There is increasing evidence that climate change will lead to greater and more frequent extreme weather events, thus underscoring the importance of effectively communicating risks of record storm surges in coastal communities. This article reviews why risk communication often fails to convey the nature and risk of storm surge among the public and highlights the limitations of conventional (two-dimensional) storm surge flood maps. The research explores the potential of dynamic street-level, augmented scenes to increase the tangibility of these risks and foster a greater sense of agency among the public. The study focused on Sunset Park, a coastal community in southwest Brooklyn that is vulnerable to storm surges and flooding. Two different representations of flooding corresponding to a category three hurricane scenario were prepared: (1) a conventional two-dimensional flood map ("2D" control group) and (2) a dynamic street-view simulation ("3D" test group). The street view simulations were found to be (1) more effective in conveying the magnitude of flooding and evacuation challenges, (2) easier to use for judging flood water depth (even without a flood depth legend), (3) capable of generating stronger emotional responses, and (4) perceived as more authoritative in nature.
  5. A utilidor is a 'system of systems' infrastructural solution to the 'subsurface spaghetti' problem resulting from direct burial of utility transmission infrastructure beneath the public right of way (PROW). The transition from direct burial to utilidors in older, dense American cities has generally not occurred, despite the potential to increase system performance in a long-term, financially and environmentally sustainable manner, because it would require reform of local planning practices and of utility pricing to support financing within a complex regulatory system. Utilidor adoption in New York City (NYC) would be a significant local infrastructure transition, amplifying the need for locality-based research, that would occur while each utility sector undergoes its own infrastructure transitions, thereby increasing the level of regulatory complexity. This paper applies transitions analysis, recursive collective action theory, and capacity to act analysis to NYC's experience with its PROW subsurface spaghetti problem and utilidor implementation to demonstrate a place-based methodology that identifies specific sources of resistance to innovative subsurface design and feasible pathways for resolving them. This methodology would be transferable for application to other American cities or classes of American cities to supplement the limits of generalised subsurface and subsurface resource integration research for practitioner application.
  6. Aerial images are a special class of remote sensing images, as they are intentionally collected with a high degree of overlap. This high degree of overlap complicates existing index strategies such as R-tree and Space Filling Curve (SFC) based index techniques due to complications in space splitting, granularity of the grid cells, and excessive duplication of image object identifiers (IOIs). However, SFC-based space ordering can be modified to provide scalable management of overlapping aerial images. This involves overcoming similar IOIs in adjacent grid cells, which naturally occur in SFC-based grids with such data. IOI duplication can be minimized by merging adjacent grid cells through the proposed "Designing Adjacent Cell Merge Algorithm" (DACMA). This work focuses on establishing a proper adjacent cell merge metric and merge percentage value. Using a highly scalable, distributed HBase cluster for both a single aerial mapping project and multiple aerial mapping projects, experiments evaluated the Jaccard Similarity (JS) and Percentage of Overlap (PO) merge metrics. JS had significant advantages: (i) generating smaller merged regions and (ii) reducing query response times by over 21% and 36%, respectively, compared to PO. As a result, JS is proposed as the merge metric for DACMA. For the merge percentage, two considerations were dominant: (i) substantial storage reductions with respect to both straightforward SFC-based cell-space indexing and 4SA-based indexing, and (ii) minimal impact on query response time. The proposed merge percentage value, selected to optimize both storage (i.e., space) needs and response time (i.e., time), is herein named the "Space-Time Trade-off Optimization Percentage" (or STOP) value.
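DACMA's exact merge rules are not given in the abstract. The following is a simplified, greedy single-pass sketch of the core idea: walk grid cells in SFC order and merge a cell into its predecessor when their IOI sets are similar enough under Jaccard similarity, so duplicated identifiers are stored once per merged region. The threshold value is hypothetical.

```python
def jaccard(a, b):
    """Jaccard similarity of two sets of image object identifiers (IOIs)."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def merge_adjacent_cells(cells, threshold=0.5):
    """Greedy single pass over SFC-ordered grid cells.
    cells: list of (cell_id, set_of_iois) in space-filling-curve order.
    Returns merged regions, each keyed by its first cell id."""
    merged = []
    for cell_id, iois in cells:
        if merged and jaccard(merged[-1][1], iois) >= threshold:
            # similar enough: extend the previous region's IOI set
            merged[-1] = (merged[-1][0], merged[-1][1] | iois)
        else:
            merged.append((cell_id, set(iois)))
    return merged
```

With overlapping aerial images, adjacent cells share most of their IOIs, so a pass like this collapses runs of near-duplicate cells and shrinks the index; PO would simply swap in a different similarity function here.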
  7. Street view imagery databases such as Google Street View, Mapillary, and Karta View provide great spatial and temporal coverage for many cities globally. Such data, when coupled with appropriate computer vision algorithms, can provide an effective means to analyse aspects of the urban environment at scale. As an effort to enhance current practices in urban flood risk assessment, this project investigates a potential use of street view imagery data to identify building features that indicate buildings' vulnerability to flooding (e.g., basements and semi-basements). In particular, this paper discusses (1) building features indicating the presence of basement structures, (2) available imagery data sources capturing those features, and (3) computer vision algorithms capable of automatically detecting the features of interest. The paper also reviews existing methods for reconstructing geometry representations of the extracted features from images and potential approaches to account for data quality issues. Preliminary experiments were conducted, which confirmed the usability of the freely available Mapillary images for detecting basement railings as an example type of basement feature, as well as for geolocating the features.
  8. Current state-of-the-art point cloud data management (PCDM) systems rely on a variety of parallel architectures and diverse data models. The main objective of these implementations is achieving higher scalability without compromising performance. This paper reviews the scalability and performance of state-of-the-art PCDM systems with respect to both parallel architectures and data models. More specifically, in terms of parallel architectures, shared-memory, shared-disk, and shared-nothing architectures are considered. In terms of data models, relational models and novel data models, such as wide-column and new structured query language (NewSQL) models, are considered. The impacts of parallel architectures and data models are discussed from theoretical perspectives and in the context of existing PCDM implementations. Based on the review, a methodical approach for the selection of parallel architectures and data models for highly scalable and performance-efficient PCDM system development is proposed. Finally, notable research gaps in the PCDM literature are presented as possible directions for future research.
  9. State-of-the-art, scalable indexing techniques in location-based image data retrieval are primarily focused on supporting window and range queries. However, support from these indexes is not well explored when there are multiple spatially similar images to retrieve for a given geographic location. Adoption of existing spatial indexes such as the kD-tree poses major scalability impediments. In response, this work proposes a novel, scalable, secondary-memory-based spatial index for key-value databases to retrieve the top k most spatially similar images to a given geographic location. The proposed index introduces a 4-dimensional Hilbert index (4DHI). This space filling curve is implemented atop HBase (a key-value database). Experiments performed on both synthetically generated and real-world data demonstrate accuracy comparable to MD-HBase (a state-of-the-art, scalable, multidimensional point data management system) and better performance. Specifically, 4DHI yielded 34%–39% storage improvements compared to the disk consumption of the original MD-HBase index. The compactness of 4DHI also yielded up to 3.4- and 4.7-fold gains when retrieving 6400 and 12800 neighbours, respectively, compared to the original MD-HBase index for the same neighbour searches. An optimization technique termed "Bounding Box Displacement" (BBD) is introduced to improve the accuracy of the top k approximations relative to the results of an in-memory kD-tree. Finally, a method of reducing row key length is also discussed for the proposed 4DHI to further improve storage efficiency and scalability in managing large numbers of remotely sensed images.
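The paper's 4DHI is a four-dimensional Hilbert index used to build HBase row keys; its construction is not reproduced in the abstract. The standard two-dimensional Hilbert mapping below (the widely published iterative algorithm) illustrates only the underlying idea: linearizing a grid so that nearby points tend to receive nearby keys, which is what makes neighbour retrieval cheap in a sorted key-value store.

```python
def xy2d(n, x, y):
    """Map (x, y) in an n x n grid (n a power of two) to its 1-D Hilbert
    curve index. Nearby cells tend to get nearby indices, so sorting by
    this value (e.g., as an HBase row key prefix) preserves locality."""
    d = 0
    s = n // 2
    while s > 0:
        rx = 1 if (x & s) > 0 else 0
        ry = 1 if (y & s) > 0 else 0
        d += s * s * ((3 * rx) ^ ry)
        # rotate the quadrant so the curve connects correctly
        if ry == 0:
            if rx == 1:
                x = s - 1 - x
                y = s - 1 - y
            x, y = y, x
        s //= 2
    return d
```

A 4-D variant applies the same reflect-and-rotate recursion over four coordinates; the key-length reduction discussed in the paper would then shorten the encoded index stored in each row key.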
  10. We present an overview of four challenging research areas in multiscale physics and engineering, as well as four data science topics that may be developed for addressing these challenges. We focus on multiscale spatiotemporal problems in light of the importance of understanding the accompanying scientific processes and engineering ideas, where "multiscale" refers to concurrent, non-trivial, and coupled models over scales separated by orders of magnitude in space, time, energy, momenta, or any other relevant parameter. Specifically, we consider problems where the data may be obtained at various resolutions; analyzing such data and constructing coupled models leads to open research questions in various applications of data science. For illustration, numerical studies are reported for one of the data science techniques discussed here, namely approximate Bayesian computation.
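As a minimal sketch of the approximate Bayesian computation mentioned above, the rejection variant works as follows: draw a parameter from the prior, simulate data under it, and keep the draw only if a summary statistic of the simulated data lands within a tolerance of the observed one. The Gaussian model, uniform prior, and tolerance below are chosen purely for illustration and are not the paper's numerical study.

```python
import random
import statistics

def abc_rejection(observed_mean, n_obs, prior_draw, n_sims=20000, eps=0.05):
    """ABC by rejection: accepted draws approximate the posterior over mu.
    Model assumption (illustrative): data ~ Normal(mu, 1); summary = mean."""
    accepted = []
    for _ in range(n_sims):
        mu = prior_draw()                                  # sample from prior
        sim = [random.gauss(mu, 1.0) for _ in range(n_obs)]  # simulate data
        if abs(statistics.fmean(sim) - observed_mean) < eps:  # compare summaries
            accepted.append(mu)
    return accepted

random.seed(0)  # deterministic for reproducibility
post = abc_rejection(observed_mean=2.0, n_obs=50,
                     prior_draw=lambda: random.uniform(-5.0, 5.0))
```

The accepted draws concentrate near the observed mean of 2.0; shrinking `eps` sharpens the posterior approximation at the cost of a lower acceptance rate, which is exactly the trade-off that motivates more elaborate ABC schemes in multiscale settings.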